
[chakra][et_converter] Add Annotation for Graph Trace Observer in PyTorch #16

Merged (1 commit, Jan 10, 2024)

Conversation

@JoongunPark (Contributor) commented Jan 8, 2024

Summary

  • This has been tested with the latest Chakra on an Ubuntu 22.04 machine.
  • The traces are from ResNet-50 with distributed data parallel, collected with a Graph Trace Observer; a sketch of a typical collection setup follows this list.
  • This PR adds annotations for handling these traces.
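
A minimal, hypothetical sketch of how such a trace might be collected, assuming PyTorch's ExecutionTraceObserver (named ExecutionGraphObserver in older releases), a two-process torchrun launch, and illustrative file names that do not necessarily match the ones used for this PR:

import torch
import torch.distributed as dist
import torchvision
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.profiler import ExecutionTraceObserver

def main():
    # Assumes launch via `torchrun --nproc_per_node=2 collect_trace.py`.
    dist.init_process_group(backend="gloo")
    rank = dist.get_rank()

    model = DDP(torchvision.models.resnet50())
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # The observer records the operator graph for the region between
    # start() and stop(); the output file name here is illustrative.
    et = ExecutionTraceObserver()
    et.register_callback(f"eg.rank_{rank}.json")

    inputs = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, 1000, (8,))

    et.start()
    optimizer.zero_grad()
    loss = criterion(model(inputs), labels)
    loss.backward()
    optimizer.step()
    et.stop()

    et.unregister_callback()
    dist.destroy_process_group()

if __name__ == "__main__":
    main()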

Test Plan

Chakra

python3 -m chakra.et_converter.et_converter --input_type PyTorch --input_filename result/eg.rank_0.pt.trace_plus.json --output_filename et_plus/chakra.0.et --num_dims 1
python3 -m chakra.et_converter.et_converter --input_type PyTorch --input_filename result/eg.rank_1.pt.trace_plus.json --output_filename et_plus/chakra.1.et --num_dims 1
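
If more ranks need to be converted, the two commands above can be wrapped in a small driver script; this is a hypothetical convenience sketch that simply shells out to the same converter invocation once per rank:

import subprocess

NUM_RANKS = 2  # rank 0 and rank 1, as in the commands above

for rank in range(NUM_RANKS):
    # Mirrors the documented chakra.et_converter CLI invocation.
    subprocess.run(
        [
            "python3", "-m", "chakra.et_converter.et_converter",
            "--input_type", "PyTorch",
            "--input_filename", f"result/eg.rank_{rank}.pt.trace_plus.json",
            "--output_filename", f"et_plus/chakra.{rank}.et",
            "--num_dims", "1",
        ],
        check=True,  # raise if a conversion fails
    )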

Astra-sim

./build/astra_analytical/build/bin/AstraSim_Analytical_Congestion_Unaware --workload-configuration=./extern/graph_frontend/chakra/et_plus/chakra --system-configuration=./inputs/system/FullyConnected.json --network-configuration=./inputs/network/analytical/FullyConnected.yml --remote-memory-configuration=./inputs/remote_memory/analytical/no_memory_expansion.json

@JoongunPark requested a review from a team as a code owner January 8, 2024 21:21

github-actions bot commented Jan 8, 2024

MLCommons CLA bot All contributors have signed the MLCommons CLA ✍️ ✅

@srinivas212 merged commit 8f0edff into mlcommons:main Jan 10, 2024
3 checks passed
github-actions bot locked and limited conversation to collaborators Jan 10, 2024